
    LEDAkem: a post-quantum key encapsulation mechanism based on QC-LDPC codes

    This work presents a new code-based key encapsulation mechanism (KEM) called LEDAkem. It is built on the Niederreiter cryptosystem and relies on quasi-cyclic low-density parity-check (QC-LDPC) codes as secret codes, providing high decoding speed and compact keypairs. LEDAkem uses ephemeral keys to foil known statistical attacks, and takes advantage of a new decoding algorithm that is faster than the classical bit-flipping decoder commonly adopted in systems of this kind. The main attacks against LEDAkem are investigated, taking quantum speedups into account. Instances of LEDAkem are designed to achieve different security levels against classical and quantum computers, and performance figures obtained with an efficient C99 implementation are provided.
    Comment: 21 pages, 3 tables
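
The classical bit-flipping decoder that the abstract refers to can be sketched in a few lines. This is a generic Gallager-style bit-flipping decoder, not LEDAkem's improved algorithm, and it uses the small (7,4) Hamming parity-check matrix as a stand-in for a QC-LDPC matrix:

```python
# Illustrative Gallager-style bit-flipping decoder (not LEDAkem's
# improved algorithm): repeatedly flip the bit involved in the most
# unsatisfied parity checks until the syndrome is zero.

# Parity-check matrix of the (7,4) Hamming code, used here only as a
# small stand-in for a sparse QC-LDPC matrix.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def bit_flip_decode(r, H, max_iters=10):
    r = list(r)
    for _ in range(max_iters):
        # Syndrome: one bit per parity check.
        s = [sum(h[j] * r[j] for j in range(len(r))) % 2 for h in H]
        if not any(s):
            return r  # all checks satisfied
        # Count the unsatisfied checks each bit participates in.
        counts = [sum(s[i] * H[i][j] for i in range(len(H)))
                  for j in range(len(r))]
        r[counts.index(max(counts))] ^= 1  # flip the most suspect bit
    return r

# All-zero codeword transmitted, single error in the last position.
received = [0, 0, 0, 0, 0, 0, 1]
print(bit_flip_decode(received, H))  # -> [0, 0, 0, 0, 0, 0, 0]
```

For sparse parity-check matrices this per-iteration work is what makes QC-LDPC decoding fast; LEDAkem's contribution is a decoder that converges faster than this baseline.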

    Chirped pulse Raman amplification in warm plasma: towards controlling saturation

    Stimulated Raman backscattering in plasma is potentially an efficient method of amplifying laser pulses to reach exawatt powers, because plasma is fully broken down and withstands extremely high electric fields. Plasma also has unique nonlinear optical properties that allow simultaneous compression of optical pulses to ultra-short durations. However, current measured efficiencies are limited to several percent. Here we investigate Raman amplification of short seed pulses with different chirp rates using a chirped pump pulse in a preformed plasma waveguide. We identify electron trapping and wavebreaking as the main saturation mechanisms, which lead to spectral broadening and gain saturation when the seed reaches several millijoules at durations of tens to hundreds of femtoseconds, for 250 ps, 800 nm chirped pump pulses. We show that this prevents access to the nonlinear regime and limits the efficiency, and we interpret the experimental results using slowly-varying-amplitude, current-averaged particle-in-cell simulations. We also propose methods for achieving higher efficiencies.

    The Newcomb-Benford Law in Its Relation to Some Common Distributions

    An often reported, but nevertheless persistently striking, observation, formalized as the Newcomb-Benford law (NBL), is that the frequencies with which the leading digits of numbers occur in a large variety of data are far from uniform. Most spectacular is the fact that in many data sets the leading digit 1 occurs in nearly one third of all cases. Proposed explanations for this uneven distribution of leading digits include scale- and base-invariance. Little attention, however, has been paid to the interrelation between the distribution of the significant digits and the distribution of the observed variable. It is shown here by simulation that long right-tailed distributions of a random variable are compatible with the NBL, and that for distributions of the ratio of two random variables the fit generally improves. Distributions that do not put most of their mass on small values of the random variable (e.g. symmetric distributions) fail to fit. Hence, the validity of the NBL requires a predominance of small values and, in real-world data, a majority of small entities. Analyses of data on stock prices, the areas and populations of countries, and the starting page numbers of papers from a bibliography support this conclusion. In all, these findings may help explain the mechanisms behind the NBL and the conditions needed for its validity. That this law is not only of scientific interest per se, but also has substantial practical implications, can be seen from the fields in which it has been put into practice, ranging from the detection of irregularities in data (e.g. economic fraud) to optimizing computer architecture with respect to number representation, storage, and round-off errors.
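
The simulation idea can be reproduced in a few lines. This sketch (with assumed distribution parameters, not the paper's) compares the leading-digit frequencies of a long right-tailed lognormal sample and of a symmetric-mass uniform sample against the Benford probabilities log10(1 + 1/d):

```python
# Sketch: leading digits of a long right-tailed distribution (lognormal)
# track Benford's law, while a distribution that does not favour small
# values (uniform on [1, 10)) stays close to a flat 1/9 per digit.
import math
import random

random.seed(1)

def leading_digit(x):
    s = f"{abs(x):e}"          # scientific notation, e.g. '3.100000e+02'
    return int(s[0])

benford = [math.log10(1 + 1 / d) for d in range(1, 10)]

def digit_freqs(samples):
    counts = [0] * 9
    for x in samples:
        counts[leading_digit(x) - 1] += 1
    return [c / len(samples) for c in counts]

n = 100_000
lognormal = [random.lognormvariate(0, 2) for _ in range(n)]  # long right tail
uniform = [random.uniform(1, 10) for _ in range(n)]          # no small-value bias

# Total variation distance from the Benford probabilities.
tv_ln = 0.5 * sum(abs(f - b) for f, b in zip(digit_freqs(lognormal), benford))
tv_un = 0.5 * sum(abs(f - b) for f, b in zip(digit_freqs(uniform), benford))
print(f"lognormal TV distance: {tv_ln:.3f}")   # small: close to Benford
print(f"uniform   TV distance: {tv_un:.3f}")   # large: fails to fit
```

The lognormal sample lands close to Benford because its base-10 logarithm is spread over several orders of magnitude, while the uniform sample gives each digit roughly 1/9, far from the 30.1% that Benford assigns to the digit 1.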

    APE: Authenticated Permutation-Based Encryption for Lightweight Cryptography

    The domain of lightweight cryptography focuses on cryptographic algorithms for extremely constrained devices. Avoiding nonce reuse is very costly in such environments, because it requires either a hardware source of randomness or non-volatile memory to store a counter. At the same time, many cryptographic schemes require unique nonces for their security. In this paper, we propose APE, the first permutation-based authenticated encryption scheme that is resistant against nonce misuse. We formally prove that APE is secure, based on the security of the underlying permutation. To decrypt, APE processes the ciphertext blocks in reverse order and uses inverse permutation calls; APE therefore requires a permutation that is efficient for both forward and inverse calls. We instantiate APE with the permutations of three recent lightweight hash function designs: Quark, Photon, and Spongent. For any of these permutations, an implementation that supports both encryption and decryption requires less than 1.9 kGE and 2.8 kGE for 80-bit and 128-bit security levels, respectively.
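
The reverse-order, inverse-permutation decryption structure can be illustrated with a deliberately tiny toy. This is NOT the real APE (it omits associated data and padding, puts an 8-bit key in an 8-bit capacity, and uses a seeded random bijection instead of Quark/Photon/Spongent); it only shows how each ciphertext block doubles as the rate of the sponge state, so decryption can walk the blocks backwards:

```python
# Toy sketch of APE-style decryption (not the real APE: no associated
# data, toy parameters, toy permutation).  State = 8-bit rate || 8-bit
# capacity; P is a fixed random bijection on 16-bit states.
import random

rng = random.Random(0)
P = list(range(1 << 16))
rng.shuffle(P)                       # toy public permutation
P_inv = [0] * (1 << 16)
for x, y in enumerate(P):
    P_inv[y] = x

def split(s):   return s >> 8, s & 0xFF     # (rate, capacity)
def join(r, c): return (r << 8) | c

def encrypt(key, iv, msg):           # key, iv: 8-bit; msg: 8-bit blocks
    rate, cap = iv, key              # the key sits in the capacity
    ct = []
    for m in msg:
        rate, cap = split(P[join(rate ^ m, cap)])
        ct.append(rate)              # each ciphertext block IS the rate
    return ct, cap ^ key             # ciphertext blocks and tag

def decrypt(key, iv, ct, tag):
    cap = tag ^ key                  # recover the final capacity
    msg = []
    for i in reversed(range(len(ct))):        # blocks in REVERSE order
        prev_rate, cap = split(P_inv[join(ct[i], cap)])
        msg.append(prev_rate ^ (ct[i - 1] if i > 0 else iv))
    assert cap == key                # stands in for tag verification
    return msg[::-1]

ct, tag = encrypt(0xAB, 0x3C, [1, 2, 3])
print(decrypt(0xAB, 0x3C, ct, tag))  # -> [1, 2, 3]
```

Because decryption needs `P_inv` at every step, a permutation that is cheap in both directions, as the abstract notes, is essential for compact combined encrypt/decrypt hardware.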

    The leading digit distribution of the worldwide Illicit Financial Flows

    Benford's law states that, in data sets from many different phenomena, leading digits tend to be distributed logarithmically, so that numbers beginning with smaller digits occur more often than those beginning with larger ones. In particular, the law is known to hold for various types of financial data. The Illicit Financial Flows (IFFs) exiting developing countries are frequently discussed as hidden resources that could otherwise have been properly utilized for development. We investigate the distribution of the leading digits in recent data on estimates of IFFs to look for the pattern predicted by Benford's law, and establish that the frequency of occurrence of the leading digits in these estimates closely follows the law.
    Comment: 13 pages, 10 figures, 6 tables, additional data analysis
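
A standard way to test such conformity is a chi-square goodness-of-fit statistic against the Benford expected counts. The digit counts below are made up for illustration (the paper's IFF estimates are not reproduced here); only the testing procedure is the point:

```python
# Sketch of a chi-square goodness-of-fit test against Benford's law.
# The observed counts are hypothetical, not the paper's IFF data.
import math

def benford_expected(n):
    """Expected count of each leading digit 1..9 among n observations."""
    return [n * math.log10(1 + 1 / d) for d in range(1, 10)]

def chi_square(observed):
    n = sum(observed)
    expected = benford_expected(n)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts of leading digits 1..9 in 1000 estimates.
observed = [295, 180, 127, 96, 81, 70, 56, 50, 45]
stat = chi_square(observed)

# Compare with the 5% critical value of chi-square with 8 degrees of
# freedom (9 digit classes minus 1), which is about 15.51.
print(f"chi-square = {stat:.2f}, consistent with Benford: {stat < 15.51}")
```

A statistic below the critical value means the observed digit frequencies are statistically indistinguishable from the Benford prediction at the 5% level.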

    Boomerang Connectivity Table: A New Cryptanalysis Tool

    A boomerang attack is a cryptanalysis framework that regards a block cipher E as the composition of two sub-ciphers E_1 ∘ E_0 and builds a particular characteristic for E with probability p^2 q^2 by combining differential characteristics for E_0 and E_1 with probability p and q, respectively. Crucially, the validity of this figure rests on the assumption that the characteristics for E_0 and E_1 can be chosen independently. Indeed, Murphy has shown that independently chosen characteristics may turn out to be incompatible. On the other hand, several researchers observed that the probability can be improved to p or q around the boundary between E_0 and E_1 by considering a positive dependency of the two characteristics, e.g. the ladder switch and S-box switch by Biryukov and Khovratovich. This phenomenon was later formalised by Dunkelman et al. as a sandwich attack that regards E as E_1 ∘ E_m ∘ E_0, where E_m satisfies some differential propagation among four texts with probability r, and the entire probability is p^2 q^2 r. In this paper, we revisit the issue of the dependency of the two characteristics in E_m, and propose a new tool called the Boomerang Connectivity Table (BCT), which evaluates r in a systematic and easy-to-understand way when E_m is composed of a single S-box layer. With the BCT, previous observations on the S-box, including the incompatibility, the ladder switch, and the S-box switch, are represented in a unified manner. Moreover, the BCT can detect a new switching effect, which shows that the probability around the boundary may be even higher than p or q. To illustrate the power of BCT-based analysis, we improve boomerang attacks against Deoxys-BC, and disclose the mechanism behind a previously unexplained probability amplification for generating a quartet in SKINNY. Lastly, we discuss the issue of searching for S-boxes with good BCTs and of extending the analysis to modular addition.
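
For a single S-box layer, the BCT is a small table: entry (Δi, ∇o) counts the x for which the boomerang quartet returns consistently through the S-box. The sketch below computes it for the 4-bit PRESENT S-box, chosen here only as a convenient public example (the paper applies the tool to the S-boxes of Deoxys-BC and SKINNY, among others):

```python
# Boomerang Connectivity Table of a 4-bit S-box.  Entry BCT[di][do]
# counts the x with  S^-1(S(x) ^ do) ^ S^-1(S(x ^ di) ^ do) == di,
# i.e. the number of states for which the boomerang comes back through
# the S-box layer consistently.
PRESENT_SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def bct(sbox):
    n = len(sbox)
    inv = [0] * n
    for x, y in enumerate(sbox):
        inv[y] = x
    table = [[0] * n for _ in range(n)]
    for di in range(n):
        for do in range(n):
            for x in range(n):
                if inv[sbox[x] ^ do] ^ inv[sbox[x ^ di] ^ do] == di:
                    table[di][do] += 1
    return table

T = bct(PRESENT_SBOX)
# First row and first column are always n = 16: with a zero input or
# output difference the boomerang condition holds trivially, which is
# the source of the "free" switching probabilities around the boundary.
print(T[0][0], T[0][5], T[3][0])   # -> 16 16 16
```

Nonzero entries larger than the corresponding DDT entries are exactly the switching effects the paper formalises: they raise the probability r of the middle layer E_m above the naive estimate.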

    Supersymmetric Aether

    It has been suggested by Groot Nibbelink and Pospelov that Lorentz invariance can be an emergent symmetry of low-energy physics, provided the theory enjoys a non-relativistic version of supersymmetry. We construct a model that realizes the latter symmetry dynamically: it breaks Lorentz invariance but leaves the supersymmetry generators intact. The model is a supersymmetric extension of the dynamical aether theory of Jacobson and Mattingly. It shows rich dynamics and possesses a family of inequivalent vacua realizing different symmetry breaking patterns. In particular, we find stable vacua that spontaneously break spatial isotropy. Supersymmetry breaking terms give masses to the fermionic and bosonic partners of the aether field. We comment on the coupling of the model to supergravity and on the implications for Hořava gravity.
    Comment: 21 pages, no figures

    Genome Sizes and the Benford Distribution

    BACKGROUND: Data on the number of Open Reading Frames (ORFs) coded by genomes from the 3 domains of Life show some notable general features, including essential differences between the Prokaryotes and Eukaryotes: the number of ORFs grows linearly with total genome size for the former, but only logarithmically for the latter. RESULTS: Simply by assuming that the (protein) coding and non-coding fractions of the genome must have different dynamics, and that the non-coding fraction must be particularly versatile and therefore controlled by a variety of (unspecified) probability distribution functions (pdfs), we are able to predict that the number of ORFs for Eukaryotes follows a Benford distribution and must therefore have a specific logarithmic form. Using the data for the 1000+ genomes available to us in early 2010, we find that the Benford distribution provides excellent fits to the data over several orders of magnitude. CONCLUSIONS: In its linear regime the Benford distribution produces excellent fits to the Prokaryote data, while the full non-linear form of the distribution similarly provides an excellent fit to the Eukaryote data. Furthermore, in their region of overlap the salient features are statistically congruent. This allows us to interpret the difference between Prokaryotes and Eukaryotes as a manifestation of the increased demand for biological functions in the larger Eukaryotes, to estimate some minimal genome sizes, and to predict a maximal Prokaryote genome size on the order of 8-12 megabase pairs. These results naturally allow a mathematical interpretation in terms of maximal entropy and, therefore, most efficient information transmission.

    HER2 therapy. HER2 (ERBB2): functional diversity from structurally conserved building blocks

    EGFR-type receptor tyrosine kinases achieve a broad spectrum of cellular responses by utilizing a set of structurally conserved building blocks. Based on available crystal structures and biochemical information, significant new insights have emerged into modes of receptor control, its deregulation in cancer, and the nuances that differentiate the four human receptors. This review gives an overview of current models of the control of receptor activity, with special emphasis on HER2 and HER3.